Path: osse.nrl.navy.mil!jung
From: jung@osse.nrl.navy.mil (Greg Jung)
Newsgroups: comp.sys.amiga.programmer,uiuc.class.cs110c,uiuc.class.cs223
Subject: Re: How do I define my integers?
Date: 17 Jan 1996 20:30:22 GMT
Organization: NRL, Washington, D.C. - Code 7650
Sender: jung@osse.nrl.navy.mil ()
Distribution: world
Message-ID: <4djm8u$rh9@ra.nrl.navy.mil>
References: <4d73ug$25v@vixen.cso.uiuc.edu> <D33Xx*bB0@mkmk.in-chemnitz.de>
Reply-To: jung@osse.nrl.navy.mil ()
NNTP-Posting-Host: ossen.nrl.navy.mil
X-Newsreader: mxrn 6.18-32

In article <D33Xx*bB0@mkmk.in-chemnitz.de>, floh@mkmk.in-chemnitz.de (Andre Weissflog) writes:
|>In article <4d73ug$25v@vixen.cso.uiuc.edu>, howard daniel joseph writes:
|>
|>> Okay. I'm writing a program under GCC on my Amiga.
Since we're talking GCC, we're hoping for a cross-platform type of
declaration, and in a widely accepted form. Several answers have been
given to the elementary form of the question. However, advancing
technology makes for a tower of Babel between code written for the
standard paradigm (short = 16 bits, long = 32 bits, quad = 64 bits,
byte = 8 bits) of currently popular computers, and other architectures.

I've just been editing code where the size of the integer does
matter explicitly: bytes might be swapped depending on the machine,
and it reads real data in 16-bit chunks. The lamers claimed to have
written it in '92 or '93; it uses old-style function declarations,
and everywhere "int" is assumed to be 16 bits. "short" is never
used.
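
One way around exactly that problem (a minimal sketch, not from the
original post; read_be16() and the input file name are made up for
illustration) is to assemble each 16-bit value from individual bytes,
so neither the host's int size nor its byte order matters:

/* Sketch: read big-endian 16-bit values byte-by-byte (plain C89). */
#include <stdio.h>

/* Works whether the host "int" is 16 or 32 bits, and on either
 * byte order, because the value is built up explicitly. */
static int read_be16(FILE *fp, unsigned long *out)
{
    int hi = getc(fp);
    int lo = getc(fp);
    if (hi == EOF || lo == EOF)
        return -1;
    *out = ((unsigned long)hi << 8) | (unsigned long)lo;
    return 0;
}

int main(void)
{
    FILE *fp = fopen("data.bin", "rb");   /* hypothetical input file */
    unsigned long value;
    if (fp == NULL)
        return 1;
    while (read_be16(fp, &value) == 0)
        printf("%lu\n", value);
    fclose(fp);
    return 0;
}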
|>>
|>The only rule about int types in C is IMHO that
|>
|> short <= int <= long
|>
This is insufficient for my purposes.
I need a symbol that
says "8 bits, like a char", "16 bits, like a signed int",
"32 bits, like an (un)signed integer".

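A common way to get such symbols (a minimal sketch only; the
int8/int16/int32 names and the limits.h test are illustrative
assumptions, not something from this thread) is one project header
that picks the typedefs per compiler and then checks them at compile
time:

/* Sketch: exact-width integer names chosen per compiler (C89). */
#include <limits.h>

typedef signed char    int8;     /* 8 bits, like a char */
typedef unsigned char  uint8;

#if UINT_MAX == 0xFFFFu          /* 16-bit "int" (e.g. old PC compilers) */
typedef int            int16;
typedef unsigned int   uint16;
typedef long           int32;
typedef unsigned long  uint32;
#else                            /* 32-bit "int" (e.g. GCC on the Amiga) */
typedef short          int16;
typedef unsigned short uint16;
typedef int            int32;
typedef unsigned int   uint32;
#endif

/* Compile-time sanity checks: a negative array size is a compile error. */
typedef char check_int16[(sizeof(int16) == 2) ? 1 : -1];
typedef char check_int32[(sizeof(int32) == 4) ? 1 : -1];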
|>With most Amiga compilers, "short" holds 16 bit accuracy,
|>"int" and "long" hold 32 bits. On the PC, older "16-bit compilers"
|>have often 16 bit ints.

By customary usage, "int" should maybe be the size of the native
pointer, "long" and "short" should be arithmetic types, and "int" is
just a counter/arbitrary integer. That's what I adopt, but it is
still left ambiguous when you know that when someone on a Cray says
"word" it is 64 bits, and when he(/she - 1%) says "double" that means
128 bits.
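
None of these sizes is guaranteed by the language, so (purely as an
illustration, not part of the original post) the only reliable way to
know what a given compiler does is to ask it:

/* Sketch: print the sizes the current compiler actually uses. */
#include <stdio.h>

int main(void)
{
    printf("char   %lu\n", (unsigned long)sizeof(char));
    printf("short  %lu\n", (unsigned long)sizeof(short));
    printf("int    %lu\n", (unsigned long)sizeof(int));
    printf("long   %lu\n", (unsigned long)sizeof(long));
    printf("void * %lu\n", (unsigned long)sizeof(void *));
    return 0;
}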
|>
|>The best solution is to consistently use the types defined in
|><exec/types.h>, called
|>
|> BYTE - signed (8-bit) byte
|> UBYTE - unsigned (8-bit) byte
|> WORD - the same for 16 bit
|> UWORD
|> LONG - the same for 32 bits
|> ULONG

This would be fine, but it is in general C code, not just in
Amiga-specific structures, that it matters. To say that UWORD is not
compiler-dependent is bogus, because it is only used on the Amiga.

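One way to keep those names while staying off Amiga-only headers (a
sketch under assumptions: the AMIGA macro test and the fallback widths
below are mine, not from the thread) is a small wrapper header:

/* Sketch: provide the exec/types.h names for non-Amiga builds too. */
#ifdef AMIGA
#include <exec/types.h>          /* real BYTE/UBYTE/WORD/UWORD/LONG/ULONG */
#else
typedef signed char    BYTE;     /* signed 8-bit */
typedef unsigned char  UBYTE;    /* unsigned 8-bit */
typedef short          WORD;     /* 16-bit on the compilers discussed here */
typedef unsigned short UWORD;
typedef long           LONG;     /* 32-bit on the compilers discussed here */
typedef unsigned long  ULONG;
#endif
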
Greg